List of Built-in Metrics
This package provides some commonly used metrics. They are implemented following a pre-defined protocol so that they can be called by experiment classes. Here is an example.
def accuracy_score(y_true, y_pred, param_dict=None):
    """Accuracy classification score.

    Parameters
    ----------
    y_true : 1d array-like, or label indicator array / sparse matrix
        Ground truth (correct) labels.

    y_pred : 1d array-like, or label indicator array / sparse matrix
        Predicted labels, as returned by a classifier.

    param_dict : dict
        A dictionary storing the parameters, including::

            sample_weight : array-like of shape = [n_samples], optional
                Sample weights.

    Returns
    -------
    score : float
    """
    # implementation omitted
These functions always take three parameters: y_true holds the ground-truth labels, y_pred holds the predictions returned by an estimator, and param_dict is a dict that stores any other parameters used by the function.
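As an illustration, here is a minimal, self-contained sketch of a metric written against this three-parameter protocol (a plain NumPy implementation for illustration, not the package's own code); it reads an optional sample_weight entry from param_dict:

```python
import numpy as np

def accuracy_score(y_true, y_pred, param_dict=None):
    """Sketch of an accuracy metric following the three-parameter protocol.

    Handles 1d label arrays; sample weights are looked up in param_dict.
    """
    if param_dict is None:
        param_dict = {}
    sample_weight = param_dict.get('sample_weight')  # optional weights
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    correct = (y_true == y_pred).astype(float)
    # np.average computes a weighted mean when weights are given,
    # and a plain mean when weights is None
    return float(np.average(correct, weights=sample_weight))

print(accuracy_score([0, 1, 1, 0], [0, 1, 0, 0]))  # 0.75
```

Passing param_dict={'sample_weight': [...]} weights each sample's contribution; omitting it (or passing None) gives the unweighted accuracy.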
The built-in metrics include:
- accuracy_score
- zero_one_loss
- roc_auc_score
- get_fps_tps_thresholds
- f1_score
- hamming_loss
- one_error
- coverage_error
- label_ranking_loss
- label_ranking_average_precision_score
- micro_auc_score
- Average_precision_score
- minus_mean_square_error
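Several of these metrics also apply to multi-label problems, where y_true and y_pred are label indicator arrays. As a hedged sketch of the same protocol in that setting (again an illustrative implementation, not the package source), Hamming loss averages the fraction of labels that disagree per sample:

```python
import numpy as np

def hamming_loss(y_true, y_pred, param_dict=None):
    """Sketch: average fraction of labels that disagree with the truth.

    Works on 1d label arrays and on 2d label indicator arrays;
    sample weights are looked up in param_dict.
    """
    if param_dict is None:
        param_dict = {}
    sample_weight = param_dict.get('sample_weight')  # optional weights
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    diff = (y_true != y_pred).astype(float)
    # per-sample fraction of differing labels in the multi-label case,
    # per-sample 0/1 mistake indicator in the single-label case
    per_sample = diff.mean(axis=-1) if y_true.ndim > 1 else diff
    return float(np.average(per_sample, weights=sample_weight))

print(hamming_loss([[0, 1], [1, 1]], [[0, 1], [1, 0]]))  # 0.25
```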
Please refer to s3l.metrics.performance for more details.